Multivariate distributions and independence
Random vectors and independence
- Definition 6.1 The joint distribution function of the pair $(X, Y)$ of random variables is the mapping $F_{X,Y} \colon \mathbb{R}^2 \to [0, 1]$ given by $F_{X,Y}(x, y) = \mathbb{P}(X \le x,\, Y \le y)$.
- Joint distribution functions have certain elementary properties:
- $F_{X,Y}$ is non-decreasing in each variable, in that $F_{X,Y}(x_1, y_1) \le F_{X,Y}(x_2, y_2)$ if $x_1 \le x_2$ and $y_1 \le y_2$,
- $\lim_{x, y \to -\infty} F_{X,Y}(x, y) = 0$ and $\lim_{x, y \to \infty} F_{X,Y}(x, y) = 1$.
- Joint distribution functions tell us how $X$ and $Y$ behave together. It is intuitively attractive to write $F_X(x) = F_{X,Y}(x, \infty)$ and similarly $F_Y(y) = F_{X,Y}(\infty, y)$, but the mathematically correct way of expressing this is $F_X(x) = \lim_{y \to \infty} F_{X,Y}(x, y)$ and $F_Y(y) = \lim_{x \to \infty} F_{X,Y}(x, y)$. These distribution functions are called the marginal distribution functions of the joint distribution function $F_{X,Y}$.
- Definition 6.7 We call $X$ and $Y$ independent if, for all $x, y \in \mathbb{R}$, the events $\{X \le x\}$ and $\{Y \le y\}$ are independent.
- That is to say, $X$ and $Y$ are independent if and only if $\mathbb{P}(X \le x,\, Y \le y) = \mathbb{P}(X \le x)\,\mathbb{P}(Y \le y)$ for all $x, y \in \mathbb{R}$.
- Equivalently, their joint distribution function factorizes as the product of the two marginal distribution functions: $F_{X,Y}(x, y) = F_X(x)\, F_Y(y)$ for all $x, y \in \mathbb{R}$ (a numerical sketch follows this list).
- We study families of random variables in very much the same way.
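As a numerical companion to Definition 6.7 (my own sketch, not from the text), the snippet below estimates the joint distribution function of two independent exponential variables by simulation and checks the factorization $F_{X,Y}(x, y) = F_X(x)\, F_Y(y)$. The rates, sample size, and evaluation point are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x_samples = rng.exponential(scale=1.0, size=n)  # X ~ Exp(1)
y_samples = rng.exponential(scale=2.0, size=n)  # Y ~ Exp(1/2), independent of X

def joint_cdf(x, y):
    """Empirical estimate of F_{X,Y}(x, y) = P(X <= x, Y <= y)."""
    return np.mean((x_samples <= x) & (y_samples <= y))

x, y = 1.0, 2.0
lhs = joint_cdf(x, y)
rhs = np.mean(x_samples <= x) * np.mean(y_samples <= y)  # F_X(x) * F_Y(y)
print(f"F_XY({x},{y}) = {lhs:.4f},  F_X*F_Y = {rhs:.4f}")  # agree up to sampling noise
```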
Joint density functions
- Definition 6.16 The pair $(X, Y)$ of random variables on the probability space $(\Omega, \mathcal{F}, \mathbb{P})$ is called (jointly) continuous if its joint distribution function is expressible in the form $F_{X,Y}(x, y) = \int_{-\infty}^{x} \int_{-\infty}^{y} f(u, v)\, dv\, du$ for $x, y \in \mathbb{R}$ and some function $f \colon \mathbb{R}^2 \to [0, \infty)$. If this holds, we say that $X$ and $Y$ have joint (probability) density function $f$, and we usually denote this function by $f_{X,Y}$.
- The elementary properties of the joint density function: $f_{X,Y} \ge 0$, $\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{X,Y}(x, y)\, dx\, dy = 1$, and $f_{X,Y}(x, y) = \frac{\partial^2}{\partial x\, \partial y} F_{X,Y}(x, y)$ wherever this derivative exists.
- Theorem 6.22 If $B$ is any regular subset of $\mathbb{R}^2$ and $X$ and $Y$ are jointly continuous random variables with joint density function $f_{X,Y}$, then $\mathbb{P}((X, Y) \in B) = \iint_B f_{X,Y}(x, y)\, dx\, dy$ (see the sketch below).
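A minimal sketch of Theorem 6.22, assuming a hypothetical joint density $f(x, y) = x + y$ on the unit square (which does integrate to 1): the probability of the event $B = \{(x, y) : x + y \le 1\}$ is obtained by integrating the density over $B$.

```python
from scipy.integrate import dblquad

# Hypothetical joint density f(x, y) = x + y on the unit square.
f = lambda y, x: x + y  # dblquad expects the inner variable first

# P((X, Y) in B) for B = {(x, y): x + y <= 1}: integrate f over that region.
prob, err = dblquad(f, 0.0, 1.0, lambda x: 0.0, lambda x: 1.0 - x)
print(prob)  # exact value is 1/3
```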
Marginal density functions and independence
- The marginal density functions of $X$ and of $Y$ are obtained from the joint density function by integrating out the other variable: $f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\, dy$ and $f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\, dx$. Recall that $X$ and $Y$ are independent if and only if their distribution functions satisfy $F_{X,Y}(x, y) = F_X(x)\, F_Y(y)$ for all $x, y \in \mathbb{R}$.
- Jointly continuous random variables are independent if and only if their joint density function factorizes as the product of the two marginal density functions.
- Theorem 6.31 Jointly continuous random variables $X$ and $Y$ are independent if and only if their joint density function may be expressed in the form $f_{X,Y}(x, y) = f_X(x)\, f_Y(y)$ for $x, y \in \mathbb{R}$ (a numerical sketch follows).
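Continuing the hypothetical density $f(x, y) = x + y$ on the unit square from the earlier sketch, this snippet recovers the marginal densities by numerical integration and shows that the factorization test of Theorem 6.31 fails, so these $X$ and $Y$ are dependent.

```python
from scipy.integrate import quad

# Same hypothetical density f(x, y) = x + y on the unit square as before.
f = lambda x, y: x + y

def f_X(x):
    """Marginal of X: integrate the joint density over y; equals x + 1/2."""
    return quad(lambda y: f(x, y), 0.0, 1.0)[0]

def f_Y(y):
    """Marginal of Y: integrate the joint density over x; equals y + 1/2."""
    return quad(lambda x: f(x, y), 0.0, 1.0)[0]

x, y = 0.3, 0.6
print(f(x, y), f_X(x) * f_Y(y))  # 0.9 vs 0.88: no factorization, so not independent
```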
Sums of continuous random variables
- Theorem 6.38 (Convolution formula) If the random variables $X$ and $Y$ are independent and continuous with density functions $f_X$ and $f_Y$, then the density function of $Z = X + Y$ is $f_Z(z) = \int_{-\infty}^{\infty} f_X(x)\, f_Y(z - x)\, dx$. In the language of analysis, this says that $f_Z$ is the convolution of $f_X$ and $f_Y$, written $f_Z = f_X * f_Y$ (see the sketch below).
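As a concrete instance (a standard example, chosen here for illustration): if $X$ and $Y$ are independent Uniform$(0, 1)$ variables, then $Z = X + Y$ has the triangular density $f_Z(z) = z$ for $0 \le z \le 1$ and $f_Z(z) = 2 - z$ for $1 \le z \le 2$. The sketch below evaluates the convolution integral numerically and compares it with this closed form.

```python
from scipy.integrate import quad

def f_U(t):
    """Density of Uniform(0, 1)."""
    return 1.0 if 0.0 <= t <= 1.0 else 0.0

def f_Z(z):
    """Convolution formula: f_Z(z) = integral of f_U(x) * f_U(z - x) dx.
    The integrand vanishes outside [z - 1, z] intersected with [0, 1]."""
    lo, hi = max(0.0, z - 1.0), min(1.0, z)
    if lo >= hi:
        return 0.0
    return quad(lambda x: f_U(x) * f_U(z - x), lo, hi)[0]

for z in (0.5, 1.0, 1.5):
    exact = z if z <= 1.0 else 2.0 - z  # triangular density on [0, 2]
    print(f"z={z}: convolution={f_Z(z):.4f}, exact={exact:.4f}")
```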
Changes of variables
- Theorem 6.50 (Jacobian formula) Let $X$ and $Y$ be jointly continuous with joint density function $f_{X,Y}$, and let $D = \{(x, y) : f_{X,Y}(x, y) > 0\}$. If the mapping $T$ given by $T(x, y) = (u(x, y), v(x, y))$ is a bijection from $D$ to the set $S = T(D)$, then (subject to the previous conditions) the pair $(U, V) = T(X, Y)$ is jointly continuous with joint density function $f_{U,V}(u, v) = f_{X,Y}\big(x(u, v), y(u, v)\big)\, |J(u, v)|$ for $(u, v) \in S$ (and $0$ otherwise), where $J(u, v) = \frac{\partial x}{\partial u}\frac{\partial y}{\partial v} - \frac{\partial x}{\partial v}\frac{\partial y}{\partial u}$ is the Jacobian of the inverse mapping (a numerical check follows).
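A minimal numerical check of the Jacobian formula, under assumptions of my choosing: $X, Y$ independent standard normals and the linear bijection $U = X + Y$, $V = X - Y$, whose inverse is $x = (u + v)/2$, $y = (u - v)/2$ with $|J| = 1/2$. Since $(U, V)$ are then known to be independent $N(0, 2)$ variables, the formula's output can be compared with that product density.

```python
import numpy as np

def f_XY(x, y):
    """Joint density of two independent standard normals."""
    return np.exp(-(x**2 + y**2) / 2) / (2 * np.pi)

def f_UV(u, v):
    """Jacobian formula for U = X + Y, V = X - Y:
    f_UV(u, v) = f_XY(x(u, v), y(u, v)) * |J|, with |J| = 1/2."""
    return f_XY((u + v) / 2, (u - v) / 2) * 0.5

def f_known(u, v):
    """Product of two N(0, 2) densities, the known answer."""
    return np.exp(-(u**2 + v**2) / 4) / (4 * np.pi)

u, v = 0.7, -1.2
print(f_UV(u, v), f_known(u, v))  # identical values
```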
Conditional density functions
- Definition 6.57 The conditional density function of $Y$ given that $X = x$ is denoted by $f_{Y|X}(y \mid x)$ and defined by $f_{Y|X}(y \mid x) = \frac{f_{X,Y}(x, y)}{f_X(x)}$ for $y \in \mathbb{R}$ and $x$ satisfying $f_X(x) > 0$ (a worked check follows).
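Continuing the hypothetical unit-square density $f(x, y) = x + y$: since $f_X(x) = x + 1/2$, Definition 6.57 gives $f_{Y|X}(y \mid x) = (x + y)/(x + 1/2)$ for $0 \le y \le 1$. The snippet checks that this is a genuine density in $y$.

```python
from scipy.integrate import quad

def f_cond(y, x):
    """Conditional density f_{Y|X}(y | x) = (x + y) / (x + 1/2) on 0 <= y <= 1."""
    return (x + y) / (x + 0.5)

x = 0.3
total, _ = quad(lambda y: f_cond(y, x), 0.0, 1.0)
print(total)  # a conditional density must integrate to 1; prints 1.0
```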
Expectations of continuous random variables
- Theorem 6.62 We have that $\mathbb{E}(g(X, Y)) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y)\, f_{X,Y}(x, y)\, dx\, dy$ whenever this integral converges absolutely.
- Theorem 6.66 Jointly continuous random variables $X$ and $Y$ are independent if and only if $\mathbb{E}(g(X)\, h(Y)) = \mathbb{E}(g(X))\, \mathbb{E}(h(Y))$ for all functions $g, h \colon \mathbb{R} \to \mathbb{R}$ for which these expectations exist.
- Definition 6.68 The conditional expectation of $Y$ given $X = x$, written $\mathbb{E}(Y \mid X = x)$, is the mean of the conditional density function, $\mathbb{E}(Y \mid X = x) = \int_{-\infty}^{\infty} y\, f_{Y|X}(y \mid x)\, dy$, valid for any value of $x$ with $f_X(x) > 0$ (a numerical sketch follows this list).
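For the same hypothetical unit-square density $f(x, y) = x + y$, Definition 6.68 gives $\mathbb{E}(Y \mid X = x) = \int_0^1 y\, (x + y)/(x + 1/2)\, dy = (3x + 2)/(6x + 3)$, a short calculation worth verifying numerically.

```python
from scipy.integrate import quad

def cond_exp(x):
    """E(Y | X = x) = integral of y * f_{Y|X}(y | x) over 0 <= y <= 1."""
    return quad(lambda y: y * (x + y) / (x + 0.5), 0.0, 1.0)[0]

x = 0.3
print(cond_exp(x))                 # numerical value, about 0.6042
print((3 * x + 2) / (6 * x + 3))   # closed form (3x + 2) / (6x + 3), same value
```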
Bivariate normal distribution
- The joint density function of the standard bivariate normal (or Gaussian) distribution with parameter $\rho \in (-1, 1)$ is $f(x, y) = \frac{1}{2\pi\sqrt{1 - \rho^2}} \exp\left( -\frac{x^2 - 2\rho x y + y^2}{2(1 - \rho^2)} \right)$ for $x, y \in \mathbb{R}$.
- Marginals: integrating out $y$ gives $f_X(x) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2}$. We conclude that $X$ has the normal distribution with mean 0 and variance 1, and by symmetry so does $Y$.
- Conditional density function: $f_{Y|X}(y \mid x) = \frac{1}{\sqrt{2\pi(1 - \rho^2)}} \exp\left( -\frac{(y - \rho x)^2}{2(1 - \rho^2)} \right)$, and so the conditional distribution of $Y$ given $X = x$ is the normal distribution with mean $\rho x$ and variance $1 - \rho^2$.
- Conditional expectation: $\mathbb{E}(Y \mid X = x) = \rho x$.
- Independence: $X$ and $Y$ are independent if and only if $\rho = 0$.
- The general bivariate normal distribution, with means $\mu_1, \mu_2$, variances $\sigma_1^2, \sigma_2^2$, and correlation $\rho$, has joint density $f(x, y) = \frac{1}{2\pi\sigma_1\sigma_2\sqrt{1 - \rho^2}} \exp\left( -\frac{1}{2(1 - \rho^2)} \left[ \frac{(x - \mu_1)^2}{\sigma_1^2} - \frac{2\rho(x - \mu_1)(y - \mu_2)}{\sigma_1\sigma_2} + \frac{(y - \mu_2)^2}{\sigma_2^2} \right] \right)$ (a simulation sketch follows).
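A simulation sketch (my own, not from the text) of the standard bivariate normal: if $Z_1, Z_2$ are independent $N(0, 1)$, then $X = Z_1$ and $Y = \rho Z_1 + \sqrt{1 - \rho^2}\, Z_2$ have the standard bivariate normal distribution with parameter $\rho$. The snippet checks the marginals, the correlation, and the conditional mean $\mathbb{E}(Y \mid X = x) = \rho x$.

```python
import numpy as np

rng = np.random.default_rng(1)
rho = 0.6
n = 500_000

# Construction: X = Z1, Y = rho*Z1 + sqrt(1 - rho^2)*Z2 with Z1, Z2 iid N(0, 1).
z1, z2 = rng.standard_normal(n), rng.standard_normal(n)
x = z1
y = rho * z1 + np.sqrt(1 - rho**2) * z2

print(x.std(), y.std())         # both marginals are N(0, 1): std close to 1
print(np.corrcoef(x, y)[0, 1])  # correlation close to rho

# Conditional mean E(Y | X = x0) = rho * x0: average y over a thin slab around x0.
x0 = 1.0
mask = np.abs(x - x0) < 0.05
print(y[mask].mean(), rho * x0)  # both close to 0.6
```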